
Concerns About How Facebook And Other Social Media Giants Highlight News Online


Guest Host: Rachel Martin


Allegations surfaced this week of political bias by Facebook — where an estimated 30 percent of American adults get news. A technology blog — citing anonymous sources — reported that Facebook suppresses news of interest to conservative readers in its list of top trending stories. Facebook strongly denies it, but the allegations reignited concerns about the potential for manipulation by online media companies like Facebook, Twitter and Google. Lawmakers and social media scholars are calling for greater transparency. Guest host Rachel Martin and her guests talk about trends in how news is curated and consumed online — and the political and social implications.


Transcript

MS. RACHEL MARTINThanks for joining us. I'm Rachel Martin with NPR's "Weekend Edition" sitting in for Diane Rehm. Facebook is coming under fire this week after allegations of political bias in the way it curates the top trending news stories that show up in the top right corner of your Facebook page. The company denies the charges. This hour, we're going to talk about the power of social media giants to influence political discourse and whether there should be greater transparency.

MS. RACHEL MARTINJoining me in the studio, Cecilia Kang of The New York Times and Jennifer Golbeck of the University of Maryland's Social Intelligence Lab. And from a studio in St. Petersburg, Florida, media ethicist Kelly McBride of the Poynter Institute. Welcome to all of you.

MS. CECILIA KANGGlad to be here.

MS. JENNIFER GOLBECKThanks for having us.

MS. KELLY MCBRIDEThank you.

MARTINAnd, of course, we want you to weigh in on this. We'll be taking your comments, questions throughout the hour. Call us at 800-433-8850. Send us your email at drshow@wamu.org. Of course, join us on Facebook or Twitter. So Cecilia, let's start off with reviewing how this whole story came to be. It began with a piece that came out in Gizmodo. What did that piece allege?

KANGSo Gizmodo, which is a tech news site, had a story published on Monday quoting former Facebook employees, or contractors, if you will, who worked on that feature that you describe, the trending news box, which is in the upper right-hand corner of the Facebook page. And those curators who used to work at Facebook said that they routinely suppressed or injected stories that were featured in that trending box, and many times they suppressed news stories of interest to conservative readers.

KANGSome of those trending topics that were excluded included the CPAC gathering, the conservative conference, Rand Paul and Mitt Romney. And on the other hand, the second thing these curators said they did was inject stories, in some cases because they were told to, that were not necessarily popular within the Facebook site but that could have been of interest. And those topics could include Black Lives Matter.

KANGSo there are allegations of political bias in what is chosen to appear in that trending news box. This is a box where anybody can click on these headlines to see more news and stories. So it has incredible influence over what kind of news is actually distributed and consumed by the general population.

MARTINAnd the big deal in this is that, as consumers, when you look at that trending box, it's supposed to be this objective source of news stories. Trending means this is a story that is really popular among all the users and there's some algorithm that's figuring it out. It's not humans. That's what we believe the trending box to reflect.

KANGThat's right. And this is actually the case with all social media sites. The social media sites that have become incredibly powerful distribution platforms all say that they are very much algorithm-based platforms for distributing news and for people to consume news. Their own algorithms surface what is most popular within their sites. And the big reveal from this Gizmodo story was that actually there's a small group of humans who were hired to curate, in some way or another, what is trending within the site, and to make some very interesting editorial decisions.

KANGHow to write the headlines, how to write the descriptions. In the story that Gizmodo wrote, the people who used to work there said they were making choices, beyond just writing the headline and the description, about what kinds of stories would appear in that box. That is very much closer to what traditional media, traditional news organizations, do. Facebook, Google News, Twitter, Snapchat and LinkedIn would all say that they are technology companies.

KANGAnd the reason why this has really sparked a lot of interest, and really a firestorm, is that it speaks to the question of, well, you're actually becoming powerful news distribution organizations as well, very much closer to what traditional news media does.

MARTINJenn Golbeck, is Facebook a media company in denial about its being a media company?

GOLBECKI don't even know how in denial they are about it. If you look at Twitter, for example, Twitter is a lot smaller, right? Facebook has about 1.6 billion users every month. Twitter has about 300 million so it's a lot smaller, but it's much better at surfacing trending news. Facebook wants people to come to Facebook for that, right? They want you on that site as much as possible and so they're really pushing very hard, I think, in a kind of explicit way that they want you on their site looking at news.

GOLBECKAnd because of that, they want to identify the trending news. They want to find reliable sources of news, which is a problem across social media sites, and show it to you so it's a place where you stay.

MARTINWhat's the difference between getting your news from the Facebook news feed that your friends might be forwarding around and that trending box that we're talking about?

GOLBECKYeah. So the news feed is from your friends and others, and there's an algorithm that picks what goes in there, too, right? And Facebook is very opaque about all of these algorithms, so we don't really know how each of them works. But what you see in your news feed from your friends is what your friends have chosen to post. I don't believe that we have curators who are working on that, simply because of scale, if nothing else. That's probably entirely algorithmic.

GOLBECKBut that trending box looks across Facebook. Now, it's potentially limited to your region, right, so we in the U.S. don't see a lot of, like, trending stories in India, even though there are a lot of Indian Facebook users. And you'll see, for example, we're based in D.C., that I'll get trending news stories about, you know, what's happening in Rock Creek Park, right? And so there's some kind of curation there, but that's really looking across what lots of people, not just your friends, are sharing and surfacing for what people like you would be interested in seeing.

MARTINSo we should note that Facebook declined to be on the program today, but a spokesperson referred us to a statement by the head of Facebook's trending topics team. His name is Tom Stocky and you can find a link to it at our website, drshow.org, but I'll read an excerpt from the statement. And it says, "Facebook is a platform for people and perspectives from across the political spectrum.

MARTINThere are rigorous guidelines in place for the review team to ensure consistency and neutrality. These guidelines do not permit the suppression of political perspectives, nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics." Cecilia, you recently, this morning, spoke with Facebook. What can you tell us about how they are looking into this?

KANGYes. They are definitely still investigating this. They vigorously deny the Gizmodo report, and they're looking into these allegations raised by anonymous sources in that story. They say that they're continuing to investigate, and they're taking this very seriously. They have very strong guidelines for their review team, is how they described them. It should be noted that these guidelines are not public, and I think it would be useful for Facebook to be able to say what these guidelines are.

KANGI think one of the problems is that they're facing an absence of trust, because they're not being transparent about how these human curators are making the choices and what kinds of choices they're making. So these guidelines are in place, they say, and if they did find evidence of political bias, or any sort of editorial decisions suppressing or highlighting particular points of view politically, those are fireable offenses.

KANGAnd they talked a little bit about what these curators do. They write the headlines. They write the descriptions that appear in the trending box. They make sure the topics align with what's really going on in the real world. They don't want duplicated stories. They also don't want hoaxes, hoax stories, to appear. They also want to make sure the items that are trending make sense. If they didn't have humans making some of the editorial decisions, for example, you might see the topic "lunch" appear at noon Eastern time every single day, because those are the kinds of stories that people are sharing within Facebook.

KANGSo trying to keep it a clean experience, is what they say. The other thing that they noted is that there actually is a lot of conservative content that appears. Donald Trump, for example, had 100,000 interactions this week alone, and Hillary Clinton had about half that amount.

MARTINWell, I want to bring Kelly McBride into the conversation, who studies the gray areas, the ethics of journalism. Kelly, how do you see this? What is so risky about what we're talking about, Facebook as a media company?

MCBRIDEWell, I don't think there's anything risky about it. All media companies are going to be imperfect. But the reason that this story has caught fire so much is because everybody is so distrustful of Facebook. Facebook has always said it's an algorithm. It's just an algorithm. And we can't tell you what's in the algorithm because then people would spin the algorithm, so we have to be super secretive about that. And everybody suspects that there's actual humans manipulating the content, both in the news feed and in the Trending Topics.

MCBRIDEAnd so this is sort of like a spark that caught fire. I think Facebook's explanation is pretty reasonable, that they have these standards. Now, I agree, there's no reason why they can't release these standards, that, in fact, it would help them because it would make the public more trustful. But they don't release it and there is this mystery behind what they do and then there's also this incredible suspicion that we know Facebook, at least, has the power to manipulate the marketplace of ideas.

MCBRIDEAnd so if you think about that in an abstract way, that means that they can manipulate democracy. And that terrifies everybody, right? Nobody wants that. And the difference between a journalism company and a media company is that a journalism company takes that responsibility very seriously and they don't necessarily do everything perfectly, right? Like, journalism companies, there's lots of ways to criticize them so I'm not saying that they're better.

MCBRIDEBut I'm saying that they outwardly take that responsibility much more seriously than a media company would.

MARTINOkay. Much more to come. We are talking about Facebook and its role as news curator, what that means for you in how you consume news and information. Stay with us.

MARTINWelcome back. I'm Rachel Martin with NPR's "Weekend Edition" sitting in for Diane Rehm. We're talking this hour about Facebook and, more broadly, social media's responsibility as a curator of news. And I'm joined in the studio by Cecilia Kang, she's a reporter for The New York Times, also Jennifer Golbeck, she's director of the Social Intelligence Lab and associate professor at the College of Information Studies at the University of Maryland. And also Kelly McBride joins us, she's a media ethicist and vice president for academic programs at The Poynter Institute.

MARTINAnd I want to read an email, because we've gotten a few people who have weighed-in on this point. This is an email from Kathy. She says, I don't understand what the problem is. Facebook is a private company. Even if they are screening stories, how is that different from Fox News or MSNBC? And this is an interesting point, right? They are making editorial choices. These are things that news organizations make all the time, Kelly.

MCBRIDERight. Right. And it's no different than Fox News or MSNBC or The New York Times, right? Everybody makes editorial choices.

MARTINOr NPR...

MCBRIDERight.

MARTIN...we're making choices. Yeah.

MCBRIDERight. The one difference is that Facebook claims that it's not doing that. That's the biggest difference is they say, no, no, no, no, no, it's just an algorithm and the algorithm is based on -- and keep in mind, we're talking about two different things here -- the Gizmodo story was about the trending topics, but what everybody's really suspicious about is the news feed in general, the big part of the screen when you're looking at Facebook. And, yeah, if they're making choices, they have that right to do that. There is a First Amendment and they are publishers, basically, they're publishing or republishing information. They have every right to do that.

MCBRIDEWhat makes many of us who are concerned about democracy so nervous is that we don't know how they're doing that. We just really have no idea. And Facebook is the 800-pound gorilla in the marketplace of ideas right now. It's where everybody goes to trade ideas and opinion.

MARTINSo tease that out a little more. When you say, those of us who are concerned with democracy, I mean, that's a big statement. So how is this -- Jennifer, how is this threatening, in some way, our public discourse as a democracy?

GOLBECKSo I want to add, as a starting place, that there are really good reasons for Facebook to filter this news. So one thing, for example: do you remember when the swine flu was a big deal?

MARTINMm-hmm.

GOLBECKThere were stories circulating all over Facebook that said you shouldn't eat pork because you could get the swine flu from eating pork, which is absolutely untrue. And Facebook came under some criticism for not doing something about that, for allowing this misinformation to spread through the network. And so this is the kind of thing that Cecilia mentioned, that they want to filter out hoaxes and inaccurate information and rumors and conspiracy theories.

GOLBECKBut the problem is that once they start making decisions, you know, throwing out all of that stuff that we all can pretty much agree should be thrown out, the question is what kind of power do they have? And so one study that we were talking about during the break: you may remember, back in the last presidential election, that at the top of your page -- you may or may not have seen this -- Facebook would put a box that said, hey, this many of your friends have voted. Did you vote? If so, click this box. And then it would show up for your friends. So they showed that to some users and not others.

GOLBECKAnd they did a research study within Facebook that said, if they showed it to you, you were 0.5 percent more likely to vote, which sounds like a small percentage but actually is a lot of people.

MARTINHmm.

GOLBECKAnd you think about it, if Facebook has that power and they know they have that power to get more people to vote...

MARTINWhich, we can agree, is like universally a good thing in this case...

GOLBECKThat's right. So we...

MARTIN...voting, voter participation.

GOLBECKSo Facebook could say, let's show it to everybody to get more people to vote. Or they could say, let's show it only to people like this -- women or conservatives or liberals or a certain minority group or whatever.

MARTINAnd then it's manipulation.

GOLBECKAnd they know that, right? So they could -- they do have the power to manipulate an election. Now I don't think Facebook has any interest in doing that. They haven't shown any interest in doing that. But just the fact that they have that power and know it and we don't know what they're doing and they don't want to talk about it makes all of us very concerned.
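To put Golbeck's 0.5 percent figure in perspective, here is a quick back-of-envelope calculation. The audience size is purely an assumption for illustration; the show does not give one.

```python
# Back-of-envelope for Golbeck's point: a 0.5 percent lift in turnout
# is tiny in relative terms but large in absolute terms.
# The audience size below is hypothetical, not a figure from the show.

users_shown_banner = 60_000_000   # assumed number of users who saw the "I voted" box
turnout_lift = 0.005              # the 0.5 percent effect Golbeck cites

extra_voters = users_shown_banner * turnout_lift
print(f"{extra_voters:,.0f} additional voters")  # 300,000 additional voters
```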

MARTINBut this goes to the transparency issue, Cecilia.

KANGAbsolutely, Rachel. I think that -- the thing is, Facebook can't have it both ways. They can't say that we are just the neutral platform but also be in the news business. And what I mean by that is understanding that there is a lot of money to be made by being a platform where news is distributed and consumed. And all of the social media organizations are getting more into this. Twitter with Moments. Snapchat has a news division. Apple News. LinkedIn, a big part of their business is actually their news feed. And they've also hired a lot of ex-journalists to do that as well.

KANGSo as these social media platforms become this moment's distribution platforms for news, they can't really have it both ways and say, hands off, we don't have the responsibility of being in the news business, when they are actually a very powerful part of news distribution and consumption, and not be transparent. When you go to The Washington Post or The LA Times or The New York Times, you know that there was a room full of people who decided what was going to go on the front page. People. You know that. That's very clear. You know that when you get to the back of the front section, there are people in another room, with a wall between them and the news side, who are on the editorial side, who have opinions.

KANGAll of these things are just sort of practices that people know. People have also made their decisions about the political leanings of particular cable news channels as well. So the question then is, as these social media platforms get more into the news business and make money from news, what are their responsibilities? And that's the real tension. And this is why there's such a firestorm: these social media platforms have such frenemy relationships with us in the news business as well. We rely on them incredibly now. We understand their power. And we hate that, but we see opportunity in that as well.

MARTINWe should also note that NPR and WAMU, as a member station, are both in a collaboration with Facebook, using its new live applications to distribute video material.

KANGFull disclosure, The New York Times is as well, as many news organizations are.

MARTINBut, Kelly McBride, I mean, is this the real crux of the problem? Because there's nothing to compel Facebook to all of a sudden adopt some kind of code of journalism ethics, something that all of us are very attuned to. There's nothing to incent them to do that.

MCBRIDEWell, there's really nothing to incent journalism companies to do that either, right? Because the First Amendment specifically says that there's no government interference with the press, it is a self-regulated industry. You know, one of the Senate committees has sent Facebook a notice that they've got some questions to ask. But that's really an idle threat, because there is no force other than the people themselves. And all that could happen would be that people would stop using Facebook.

MCBRIDENow, even though that doesn't seem likely right now, because we're all so addicted to Facebook and we get it on our phones and it's super easy and useful, that's a real threat. I mean, all it would take is some other social media platform to disrupt Facebook just enough to start the trend, start the mass exodus. Facebook doesn't want that to happen. So Facebook really has its own business motivations to keep the public trusting Facebook. Whether they're making the right decisions about how to do that, you know, we could argue left and right, and I'm sure they argue internally, about how to get people to trust them.

MCBRIDEBut it's in their business interest to continue to get people to trust them. And there's no way that an outside regulator could impose any sort of regulation on them.

MARTINAnd the fact of the matter is, there is no competition for Facebook. They're the only one in the game right now.

MCBRIDEWell they tend to buy up anything. Like, I mean, Instagram is the closest thing. But who owns Instagram now? Facebook.

MARTINYeah. And I want to delve into a little more about how these algorithms work. We got an email from Joey who says, hasn't it just been studied and released that algorithms tend to have a bias that mirrors the person who made the algorithm? Jen?

GOLBECKSo it sort of depends is the answer. We could talk about this for an hour, but I'll try to keep it short.

MARTINGive it a short answer, yeah.

GOLBECKYeah. So, look, the shortest answer: the easiest way you could build this trending-topics algorithm for Facebook is to say, let's count, say, hashtags or links, right? Whatever it is that we want to surface. We're just going to count how many times they were shared, we're going to put them in order, and we're going to show the top 20. Right? I mean, that's a simple algorithm. Is there any bias in that? No, right? I mean, a programmer hasn't been biased. They say we're going to show the top 20 and we do. And what do you get? A lot of garbage, when you do it that way. You get lunch, right? And you can sort of see this on Twitter, too.

MARTINJustin Bieber.

GOLBECKJustin Bieber. Yeah, you get, you know, all kinds of these weird hashtags that sometimes surface on Twitter. You don't know what they're about. And so you get not very engaging or valuable stuff there. And so you want to kind of control for that. And there's good reasons. Aside from the irrelevant content, you can get super-racist content. You can get kind of trolls manipulating it to surface really terrible things. We've seen all of this. And so people come in and they say, all right, we're going to tweak the algorithms to try to get rid of the garbage and find more interesting stuff.
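A minimal sketch of that unweighted baseline, before any of the tweaks Golbeck mentions. The share data is invented for illustration; this is not Facebook's actual code.

```python
from collections import Counter

def naive_trending(shared_items, top_n=20):
    """The simplest trending algorithm Golbeck describes: count how many
    times each item (hashtag, link) was shared and return the top N.
    No editorial judgment at all, which is exactly why it surfaces
    'lunch' at noon and other garbage."""
    return Counter(shared_items).most_common(top_n)

# Hypothetical share data for illustration:
shares = ["#lunch", "#lunch", "#lunch", "#lunch", "#cpac", "#cpac", "#bieber"]
print(naive_trending(shares, top_n=3))
# [('#lunch', 4), ('#cpac', 2), ('#bieber', 1)]
```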

GOLBECKSo there are biases that come in, but it's not as straightforward as saying, oh, you have a racist or a sexist programmer who ends up including features that highlight racist or sexist content. It gets a lot more complicated than that, where they can say, let's consider zip code, which can map very closely to race. And suddenly the algorithms start seeing race in a way that no one actually intended but that you have to think through.
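A toy illustration of that proxy-variable point, with invented numbers: the scoring function below never sees race, yet a learned zip-code weight can reproduce a racial disparity anyway.

```python
# Toy example of a proxy variable: 'race' is never an input, but if
# engagement weights are learned per zip code and neighborhoods are
# segregated, scores end up differing by race anyway. All values invented.

zip_engagement_weight = {"20001": 1.5, "20815": 0.5}  # hypothetical learned weights

def story_score(base_score, user_zip):
    # The function mentions only zip code, yet in a segregated city this
    # weighting can encode race without any programmer intending it.
    return base_score * zip_engagement_weight.get(user_zip, 1.0)

print(story_score(10.0, "20001"))  # 15.0
print(story_score(10.0, "20815"))  # 5.0
```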

MARTINBut inevitably we keep circling back to the fact that, if we knew how Facebook was making these decisions, if we knew what these curators were empowered to do, if we knew the human element of the algorithm, probably it would engender a lot more trust in users.

KANGI think trust is absolutely the key, and the reason why this has sparked such incredible interest, as well as interest on the Hill by Senator John Thune, who sent that letter to Mark Zuckerberg saying, here are the questions I want to ask. Maybe an idle threat. Definitely politics. But what that shows, what that illustrates, is that it taps into a real frustration in the general public, particularly among Republicans, and actually among Democrats as well in this election cycle, who have said over and over again, the system is rigged, the media is against us, we are seeing so much bias in the news. It's confirmation bias.

KANGAnd, in fact, when I looked at the response by Tom Stocky, the Facebook executive, where he outlined, in a pretty long explanation, how it works, defending the practices and denying a lot of the allegations in the Gizmodo story, what was more interesting was all the comments, so many comments from people saying, this is exactly what I suspected. You cannot deny it now. The news is out. The game's up, Facebook. Now we know.

MARTINThe jig is up, yeah.

KANGAnd it's just, if you don't engender that sort of trust, if you don't give more information for people to actually make these decisions, you get that kind of response.

MARTINOkay. Much more ahead. I'm Rachel Martin with NPR's "Weekend Edition." And you are listening to "The Diane Rehm Show." If you'd like to join us, call 1-800-433-8850. Or send an email to drshow@wamu.org. Find us on Facebook or send a tweet. We want to hear what you think about how you get your news from Facebook and how Facebook is making those choices. Jennifer.

GOLBECKI wanted to follow up on this point of transparency. I'm the computer scientist in the room, right? I'm not a journalist. It's really hard to be transparent about these things. So I think, certainly, being transparent about the humans and their guidelines absolutely should be done. Being transparent about the algorithms is very hard.

MARTINHmm.

GOLBECKOne, they're not easy to understand, even for computer scientists. Like, I look at these algorithms, and they're extremely complex. And there are lots of them at play at once. And this is interesting, like you were talking about before, you know, how they can be transparent and explain how this is working, and transparency is important.

GOLBECKAnd tech companies have exactly the opposite side of that in their culture. They say, well, we're always experimenting. We're always changing. We have lots of versions of these algorithms running for different people all the time. You shouldn't feel bad that you're being experimented on. You know, this is how we run our business, that we try experiments, we change stuff and so we can't possibly tell you.

GOLBECKAnd, in fact, when other ethical concerns have come up, Facebook has really pushed back and said, well, you opted in to use the system. So there was a big outcry a couple summers ago where they did this, what they called an emotional contagion study. They showed some people only happy news from their friends.

MARTINMm-hmm.

GOLBECKAnd they showed some people only sad news, to see how it affected what they posted. Researchers like me in the academic community freaked out about this. This is completely unethical research. No one opted in to this. There was no informed consent. We have all kinds of protections in the academic space. And Facebook said, well, if you look at our terms of service, there's a line that says, your data may be used for this, this, this, and research. And so then people have consented...

MARTINYeah.

GOLBECK...to be in this experiment.

MARTINIt's a private company.

GOLBECKYeah. And on one hand, there's no law against this. On the other hand, the community of researchers says this is unethical and shouldn't happen. And the corporate response is, we can do whatever we want and you shouldn't be upset about this. So that's a different ethical concern. But I think you get this same kind of cultural response here, in the journalism space, where we're concerned with those ethics.

MARTINWell, and we should point out, we are journalists. We care -- we have a different level of interest in this story, because we think of what we do as this public good, especially in public media. And Facebook is essentially saying, hey, we don't -- we're not purporting to be journalists. And if you sign up to view our content, you know, you're signing on to the deal. And it's not the same thing as listening to public radio or reading The New York Times. Kelly.

MCBRIDEYeah. You know, I actually think the public's really interested in this, too. Yes, because we're journalists, we are super interested in this. But the other reason that we're interested in this is we all did these stories following the Gizmodo story and they spiked like crazy on our sites. So we use these same algorithms to tell us what we're -- what people are interested in. And the public is definitely interested in this because, in the same way that they have been suspicious of news media, they are suspicious of Facebook.

MARTINSo let's go to the public. I want to bring in Heidi of Portland, Maine. Heidi, you're on the air.

HEIDIHi. Great. This is my first time on "The Diane Rehm Show." Thank you so...

MARTINOh, I'm sorry Diane's not here to say hi. But I'm glad you're here.

HEIDIAh. Well, I love the show. We're very grateful to have it on live today.

MARTINGreat.

HEIDISo that we can call in. I listen on MPBN and it's wonderful.

MARTINHappy to hear it. You got a question about Facebook?

HEIDIYes. I'm a social and environmental advocate with about 1,200 friends from music, politics, nonprofits, science and my childhood. And I've got four different concerns. One is that I found that during Obama's inauguration and the Cairo peace talks, they actually offered you the chance to watch the event and then have a chat on the side with people from all over the world. And I made some of my closest friends now with those people. And I don't see that being offered anymore. That's one. First, and there's a...

MARTINWe might only get to two of your concerns before the break, so.

HEIDIOkay. Well, then...

MARTINKeep that in mind.

HEIDIThere's a 500 friend limit for events.

MARTINMm-hmm.

HEIDIAnd I'd like to invite all of my friends. And you can't search for global events like you used to be able to. And now you can barely find anything in the events in your neighborhood. And as somebody that produces public events, I would like all of my friends from throughout the country to be able to see what's going on and be able to come, especially the radio astronomy program that I produce. I'd like...

MARTINYou'd like them to be able to hear that. So the caller is talking about frustrations with Facebook that are very broad, Cecilia.

KANGShe certainly illustrates the importance that the network has for so many people. And that is why we're even having this discussion, it's so powerful on its scale.

MARTINYeah. And we'll talk more about that. You are listening to "The Diane Rehm Show." Stay with us. Your calls and questions coming up next.

MARTINHi there. Welcome back. I'm Rachel Martin with NPR's "Weekend Edition," sitting in for Diane Rehm. We're talking about Facebook as a curator of news. And I'm joined in the studio by Cecilia Kang. She's a reporter with the New York Times. Also Jennifer Golbeck. She is director of the Social Intelligence Lab at the University of Maryland. We've also got Kelly McBride with us. She is a media ethicist and vice president for academic programs at the Poynter Institute.

MARTINAnd I want to begin this part of the conversation with a comment or a question that we've gotten from several listeners. This one comes from Erin. She wrote into the DR show website. She says, "I'm stunned that you're wasting your time with this topic. Especially since all the sources of this allegation are anonymous. How can we verify the truth of these accusations?" Cecilia, she's talking about that Gizmodo piece that started this.

KANGThat's right. There are two pieces. The piece on Monday relied really heavily on one anonymous source. Another story that was published earlier this month by Gizmodo also touched on some of these topics, but more broadly quoted several anonymous sources and former contractors with Facebook. So the anonymous sourcing is certainly something that people have brought up. That said, though, these sources have told Gizmodo that they signed non-disclosure agreements with Facebook saying they would not talk about the company and what happened within the company from their time there.

KANGSo they're contractually bound not to do that, and they're afraid for that reason. But I do think that, you know, the reason why we're all talking about this, the reason why there has been such a huge press cycle around this, is because this taps into so many things. It taps into an existing frustration in the public, as well as among political leaders, that the media has not done justice to this election cycle. That there are inherent biases that have come out and have heavily influenced the election.

KANGSo there's already discord in that way. It also taps into a suspicion and a concern that the media industry has about its business models and its overreliance on social media. And it's a very tense relationship. Thirdly, I would say that one problem that these companies have is that their leaders have been very outspoken about their own social causes and their own political leanings. And so it's hard…

MARTINThey, as individuals, are using social media to promote causes.

KANGThey as individuals, absolutely. Or the CEOs, I'm sorry, of these tech companies. So Tim Cook, for example. Apple has a news product. Tim Cook is very vocal about his views on marriage equality rights and equal justice. Mark Zuckerberg and Sheryl Sandberg, both of Facebook, have also been very clear about their political leanings and their social causes. Eric Schmidt, the chairman of Alphabet, formerly Google, has also been very closely aligned with many liberal causes. And he actually has held different positions, informally, within the administration. So there are already suspicions.

MARTINBut they're private citizens. Are they not allowed to hold opinions and talk about them?

KANGThey are certainly private citizens. And especially at tech companies, that's just fine. You have your own political leanings. But when you are the head of a media organization, then people think about you, at the top of the masthead, differently. Or at least you have to be clear about what your role is.

MARTINKelly McBride?

MCBRIDEWell, most journalism organizations have pretty clear policies about who can and can't have political leanings. We talk about it as a separation of church and state. You know, the publisher, because he's the head of the business side of the company, it's okay if he gives to political campaigns and makes political statements. And often they do. The editor, not so much. You know, it's generally considered not a good idea.

MCBRIDEThe Huffington Post actually got into a really interesting kerfuffle recently because Arianna is going to serve on the board of a company that they cover all the time.

MARTINOh, yeah, of Uber.

MCBRIDERight. And so that was -- that's like another tech/journalism company that has gotten into a little bit of trouble because they don't have these well-established cultural expectations of independence.

MARTINI want to go to Derek in Westmoreland, N.Y., who's got a question. Hey, Derek, you're on the air.

DEREKHi. Thanks for taking my call.

MARTINYou bet.

DEREKMy question concerns revealing the standards and practices used in the algorithm. If that happens wouldn't that open it up to individuals, corporations, politicians, whoever, to use those standards to trend themselves?

MARTINTo manipulate the algorithms.

DEREKWouldn't that -- right. Wouldn't that take the power away from the…

MARTINYou are a savvier user than I, Derek. But I think your point is a good one. Thanks so much. Jennifer, what do you think about that?

GOLBECKYeah, so that's a big reason that these companies say that they want to keep things private. And we've certainly seen this happening. So there's this great term called astroturfing, which is like my favorite. So it's like grassroots, but fake. Right? Hence, astroturfing.

MARTINOkay. Got it. I think.

GOLBECKYeah. So it's basically people going onto social media, you know, and also blogs and things, but especially Twitter and Facebook. Conservative causes have been especially good at this. And they start making posts so it looks like there's a lot of grassroots support for a candidate or a cause, but actually it's a bunch of fake accounts, or a few people, controlled by a campaign or an organization.

GOLBECKSo they make hundreds of Twitter accounts that are all posting different things about, support this bill, or support this candidate. So, hey, it looks like there's all this grassroots support, and actually it's completely fake. Right? It's just support from a bunch of accounts run by a certain organization. And this is exactly the kind of thing that you can do. You know, imagine having thousands of Facebook accounts saying something about Donald Trump, not that he needs more coverage than he's already getting. But then suddenly you can start seeing that trending, 'cause lots of people are talking about it.
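Astroturfing works precisely because a count-based trending algorithm cannot tell sock puppets from real people. A sketch with invented accounts and topics:

```python
from collections import Counter

# Why astroturfing beats a naive counter: 500 sock-puppet accounts run
# by one organization look identical to 500 real people. Account names
# and topics below are invented for illustration.

organic = [(f"user_{i}", "#localnews") for i in range(40)]
sock_puppets = [(f"bot_{i}", "#astroturfed_cause") for i in range(500)]

counts = Counter(topic for _account, topic in organic + sock_puppets)
print(counts.most_common(2))
# [('#astroturfed_cause', 500), ('#localnews', 40)]
# Real defenses need signals beyond raw counts: account age, posting
# cadence, network structure, and so on.
```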

MARTINKelly McBride?

MCBRIDERight. And it's entirely possible that what this whole thing is about is Facebook was trying to prevent that, trying to prevent those efforts from trending in the trending news. And it was interpreted by the staff as an anti-conservative viewpoint.

MARTINAlthough, again, we would know that if they were more transparent about how…

MCBRIDEIf they were more transparent, right. But who knows?

MARTIN…they do their business, yeah. What about this idea some have floated, Cecilia, about Facebook perhaps appointing a public editor, an ombudsman of sorts who could be the voice of the people, who could represent the audience or the readership and how Facebook is putting information in front of them?

KANGSo that would be a full leap into news. That's a full leap into Facebook and other social media organizations saying we are journalism organizations as well. You know, it's an interesting question, because curation of news is very powerful, as well as the creation of news. They don't create their own content, but they curate news. And so they say that we're not actually journalism organizations.

KANGBut as we found, curation is really important. Like, we don't know exactly what goes into the decision making of Twitter Moments, for example. We know that humans make those decisions, but, you know, why are they…

MARTINExplain what Twitter Moments are.

KANGTwitter Moments is a feature within Twitter where they curate, basically, the hottest things that are trending. It's similar to trending on Facebook, in that it's a combination of humans as well as the algorithm. And they post the best tweets that illustrate whatever that trending topic was about, the tweets that were within their universe. And so those kinds of things -- Twitter Moments, Apple News, which is a curation site.

KANGGoogle News, which seems a little bit outdated, has been sort of the granddaddy of them all; it's been around for quite some time. It has also caused a lot of consternation over the years. They say they're fully algorithmic, that there's no human involvement at all. So I think that it would be a full leap into saying that you're a journalism institution. And then you have to really think about what your relationship is with the public. What is the social compact with, not just your users, but with the public in general?

MARTINKelly McBride, does Facebook have a social compact with its audience?

MCBRIDESure, they do. It's unwritten, but they absolutely do. And it just depends on how seriously Facebook wants to take that contract with the audience and whether they want the audience to see them as a trustworthy affiliate, as opposed to just a tool. Now…

MARTINDo you think they should appoint a public editor?

MCBRIDEAbsolutely. Absolutely. I mean, I said that. The very first thing that I wrote in the wake of this, I said that. And I just wrote a tongue-in-cheek job description for them. It's on our website at Poynter.org if anybody wants to read it. But, yeah, I think…

MARTINYou're not looking for a job, though, we should point out.

MCBRIDENo. Not for me. Not for me. I know, my God, that would -- it would be one of the hardest jobs in the universe.

MARTINIt would.

MCBRIDERight? Like, you could not pay anybody enough money because you would make no friends, you would be incredibly lonely. It would be a tough, tough job.

MARTINJennifer?

GOLBECKIt's interesting to me to hear this discussion from the journalists in the room, right, again, as a non-journalist. Because I don't think Facebook necessarily sees themselves as having a journalistic function there. Like, they have people doing it, but they're making money that way. And other companies are making money that way. And I think if you appointed this person, what they would really be dealing with is not the average person who says I want to know this.

GOLBECKThey're gonna be dealing with companies who are making or losing money based on their decision. So as one quick example, you used to see all of these click-baity headlines. Right? You still see plenty of them, but you used to see a lot more on Facebook. Oh, you'll never believe what happened next. Right? Click here to go to this other site to watch the video. People…

MARTINYeah, I never click on those, ever. Right.

GOLBECKAnd, well, this is the thing, people didn't like them. Right? Like, they're tempting to click on, but people would get irritated because it's not that interesting what ends up being there. Right? They would get frustrated. And so it was decreasing engagement. Facebook makes money on engagement and keeping you on the site. So they made a conscious choice to adjust their algorithm to not rank those stories as highly, both in trending and in the news feed, which Kelly has mentioned.

GOLBECKAnd so imagine you're a company that does click-bait. Right? Like, you're Upworthy or one of these places. And suddenly Facebook changes their algorithm so you're not highlighted as much. What are you gonna do? You're gonna go to this ombudsperson and talk to them about the algorithm and the choices that they're making. And I think there's so much money driving these other media companies, who are getting eyes on their content through Facebook, that that's the main person an ombudsman would be dealing with, not the average reader.
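In spirit, the clickbait demotion Golbeck describes could look like a penalty term in a ranking score. The trigger phrases and weights below are pure assumptions; Facebook's actual signals are not public.

```python
# Hypothetical sketch of demoting clickbait in a ranking score.
# The phrases and penalty weights are invented; they only illustrate
# the shape of the adjustment, not Facebook's real system.

CLICKBAIT_PHRASES = ("you'll never believe", "what happened next", "click here")

def adjusted_score(engagement_score, headline):
    text = headline.lower()
    penalty = sum(0.5 for phrase in CLICKBAIT_PHRASES if phrase in text)
    return engagement_score * max(0.1, 1.0 - penalty)

print(adjusted_score(100, "You'll never believe what happened next"))  # 10.0
print(adjusted_score(100, "City council passes budget"))               # 100.0
```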

MCBRIDEYeah, but, I mean, what the ombudsman would do then is say, look, Facebook's loyalty is to its users and so here's why we're deprivileging click-bait in favor of something else.

MARTINSo you think that their loyalty is to their users?

MCBRIDEWell, they -- it should be. That's how they're gonna make their money. Right? I mean, if the users all turn away then…

MARTINCertain users, perhaps.

GOLBECKThey make their -- but that's the thing, right? They don't make their money from their users. We don't pay anything. They make…

MCBRIDEAdvertisers.

GOLBECK…their money from advertisers who are exploiting our information. So sure, I mean, if all the users left, Facebook would lose money. But I think their loyalty is much more to their advertisers and the companies they partner with than their users.

MCBRIDEAny platform can be disrupted. I mean talk to newspapers who didn't think they'd ever be disrupted. Any platform can be disrupted if they -- if their audience loses trust in them.

MARTINBut it sounds like we are just on the edge of technology. That Facebook is perhaps just pushing us into this conversation, and they are evolving and trying to figure out whether they're trying to be too much to too many people.

MCBRIDEAbsolutely.

MARTINStay with us. I'm Rachel Martin, with NPR's "Weekend Edition," and you're listening to "The Diane Rehm Show." And we're continuing our conversation about Facebook and its responsibilities, if it has any, to its audience, to its readership in terms of the information that it's pushing forward. And I want to go to another caller. I want to bring in Michael of Baton Rouge, La. Hey, Michael, you're on the air.

MICHAELHey, how you all doing today?

MARTINWe're doing well, thanks.

MICHAELCool. Just thought I'd offer you a little perspective in terms of whether it's the users or the companies that really make the decisions. 'Cause I've looked into it a lot. I've just noticed Facebook, and in general a lot of social media, really going off the wall. And so looking deep, deep into it, all you have to look at is these PR firms. And all you have to do is go to their website. These big, big PR firms, like Edelman, they brag about how Facebook is just the best platform that they could ever possibly use to market their material.

MICHAELAnd other than even marketing their material, to do damage control. You know, say Bob's shampoo pays the PR company to monitor their Facebook presence. If someone anywhere says, hey, Bob's shampoo made my kid's hair fall out, they control whether anyone sees that or not. Like, there is literally someone who decides, okay, nope, only their friends can see that. So you could be trying to spread the word about anything, and it's literally the PR companies who are getting paid to deal directly with Facebook to do this. And Facebook is apparently the best platform that they could use for this, 'cause they brag about it.

MARTINYeah, thank you. Thank you so much, Michael. Let's put that to our guests in the studio. Jennifer, what about that? What about Facebook and its role as an advertising platform?

GOLBECKYeah, so, I mean, it certainly is one. That's where they're making their money. And I think part of the reason that you're seeing these social media companies push into something that looks like journalism, curating news, is because they want to keep you on their site. They want to be the place that you come. I think Twitter has done a really good job of that, actually. Twitter is where I go to get a lot of news. You know, there's a couple of sites. I'll go to the New York Times and the Washington Post, read the headlines. But then most of the rest of my news comes from Twitter because I have curated people who post that.

GOLBECKSo that's why they want to do it. Right? I go to the site to get news from there. But I think it's interesting, you know, coming back to this point that Kelly made, which ties in with this, are you gonna lose the trust of your users if you're curating the news in a certain way, if PR companies are controlling what you see. And I think the question then becomes why are people using these sites.

GOLBECKYou know, are we using it to get news or are we using it for -- what I think a lot of people use Facebook for, which is like socially interacting with our friends. Like, I see my friends' dog pictures and, you know, where they went on their date last night. And the news happens to be an aside to that. If Facebook started posting terrible news, I would still go to it because my friends are there. Right?

MARTINBut is the risk -- if we're only getting news that our friends give us, then does that put us in some kind of information bubble that can be dangerous in some ways, in terms of getting people exposed to a variety of ideas? Kelly?

MCBRIDESure, absolutely. I mean, what we're doing as a society is transferring the responsibility to have a well-balanced diet of news to the consumer. We used to place that responsibility on the distributor or the provider, and now we're saying, hey, if you want to be a good citizen, it's on you to make sure you have a wide-ranging diet of news sources. And, you know, not everybody's there.

MCBRIDEBut I think most people are getting there. There is a certain portion of the population that has been polled and says, no, I want my news to reflect my own filter. And, you know, that is generally right around 30 percent. It stays pretty consistent. But that is now a responsibility of citizens, not necessarily distributors.

MARTINCecilia, I'll give you the final word this hour.

KANGYeah, I would say that that's been going on for some time. You, as an individual, make your choice about what cable channel, for example, you decide to watch. You decide what newspaper to subscribe to. We've made our own decisions for quite some time. It becomes easier to have a narrower filter, and more filters, on what you get online, because there's just so much information. And it's easy to filter that.

KANGIt is the responsibility of the consumer. I'd like to give people more credit, too, that they are good about trying to be informed people. And I think that it's a great experiment to see what kind of diet of information you're gonna get on a daily basis when left to your own devices and not told, these are the five most important stories you should know at the top of the hour, and have that fed to you. I think it's a marvelous experiment. I think also, we have no choice, as news organizations, but to go along with it. The train has left the station.

MARTINAnd as consumers.

KANGThe train has left the station.

MARTINAnd just real quick, Jennifer, do you think Facebook -- this is gonna move Facebook in any direction?

GOLBECKI think probably not for a while. I think they're really driven by the money, and they want to find ways to keep engagement. And if they think keeping things opaque is gonna do that for them, they'll continue to do it.

MARTINWell, it's been a fascinating conversation. We've been talking about Facebook, whether or not it has a responsibility in terms of the news that it is curating for you. We've been joined in the studio by Cecilia Kang, reporter for the New York Times, and Jennifer Golbeck, she's the director of the Social Intelligence Lab and associate professor at the University of Maryland. She's also the author of a book, "Introduction to Social Media Investigation." And Kelly McBride, media ethicist and vice president for academic programs at the Poynter Institute and the author of "The New Ethics of Journalism: Principles for the 21st Century."